Ghost in the machine: Can an AI version of the dead help us mourn?
In the quiet solitude of a late evening, a mother sits at her computer, typing messages to her dead daughter. But here’s a twist: The dead daughter is texting back too. It’s not magic. The mother is using an AI-powered chatbot that has scanned through the daughter’s archives of texts, comments and audio clips, and has learnt to respond just like her – down to the speech patterns and catchphrases. Is she real? No. But she’s a dead ringer, just similar enough to help Mum grieve.
It didn’t take us long to get comfortable with AI. Just last year, we were excited to ask machines to re-imagine Game of Thrones characters in Indian avatars. This year, we’re using the Kling AI app to insert our current selves into our childhood photos and hug our younger selves. For those mourning the loss of a loved one, AI has been busy too – there’s already a name for the business: Grief Tech. And as with all AI, some folks have jumped right in, while the rest are still figuring out if it’s a good idea.
RiP.gif
For as long as people have been dying, the living have sought ways to maintain a connection with the ones they’ve lost. We etched them on rock art. We painted flattering portraits. We tried seances. We kept their social media accounts active. AI, then, is merely another tool to process grief. It generates digital avatars, chatbots known as “thanobots,” even virtual-reality experiences, so we can interact with loved ones again.
It works the way most machine-learning software does. A bot scans, reads and remembers vast troves of a dead person’s digital footprint. The selfies, video clips, texts, emails and social media posts help build eerily accurate simulations for paying customers to interact with.
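The real products fine-tune large language models on a person’s archive, but the core idea – match an incoming message against what the person actually said – can be sketched in a few lines. This is a toy illustration only, with made-up messages; it simply returns the archived line whose words best overlap the prompt.

```python
# Toy sketch of the retrieval idea behind grief-tech chatbots.
# Real systems fine-tune large language models on a person's archive;
# this stand-in just picks the archived message with the highest
# word overlap. All messages below are hypothetical examples.
import re
from collections import Counter

ARCHIVE = [
    "miss you too, call me when you land",
    "haha classic, you always burn the toast",
    "don't worry about the exam, you've got this",
]

def tokens(text: str) -> Counter:
    """Lowercase word counts, ignoring punctuation."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def reply(prompt: str, archive=ARCHIVE) -> str:
    """Return the archived message sharing the most words with the prompt."""
    want = tokens(prompt)
    return max(archive, key=lambda msg: sum((want & tokens(msg)).values()))

print(reply("I'm worried about my exam tomorrow"))
```

A prompt about an exam pulls back the archived reassurance about the exam – eerily in-voice, but only ever a remix of what was already said, which is exactly the limitation the experts below point to.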
We’ve seen this in fiction already. An episode of Black Mirror is about a woman mourning her boyfriend, who reluctantly tries new tech to communicate with an AI version of him. Eternal You and Love Machina, two documentaries that premiered at the Sundance Film Festival this year, explore how the bots challenge the idea of someone gone forever.
But it’s the business end where all the buzz is. StoryFile started off preserving Holocaust survivor stories, using AI to transform videos into lifelike, interactive conversations. The firm has used the same tech to get grieving folks to interact with deceased family members via a chatbot. Another firm, You, Only Virtual (YOV), mimics a dead person’s personality through communication datasets, to do much the same thing.
“These innovations can offer profound comfort to those grappling with loss, providing a way to say the unsaid and achieve closure,” says Dr Sachin Baliga, consultant psychiatrist at Fortis Hospital, Bengaluru. But what happens when these digital doppelgangers start saying things the real person never would? Would it impede the grieving process, trapping people in an endless loop of digital interaction instead of helping them move on? “Utilising any tool for grief management is only effective if the individual is not in a denial phase. Or it can lead to depression, psychosis, and even hallucinations,” warns Dr Baliga.
Dr Aarushi Dewan, clinical psychologist and founder of Coping Keys in Delhi, says that grief therapy – talking to a trained professional instead of an archive-based sim – is a healthier way to mourn. “AI tools designed for grief may inadvertently prolong the denial stage. Users might become addicted to these technologies. It’s a slippery slope.”
Save, view, delete
Prabhanjan Dwivedi, senior technical officer at India’s Biotechnology Industry Research Assistance Council, says that grief tech is groundbreaking because it offers “a way to keep a person’s legacy alive, allowing future generations to know their ancestors in a more intimate way than through photos and recordings”.
It will never be the real thing – we’re more than the sum of our digital detritus, after all. An AI simulation lacks the nuance and surprise of genuine human interaction. “In situations where customers seek specific solutions or guidance from the AI, the responses might be influenced by the limitations or gaps in the data, potentially leading to less accurate or logical outcomes,” says Dwivedi. Imagine asking a dead relative for forgiveness or a password, via AI.
And it might not be how a loved one would want to be remembered in the first place. Critics warn of the potential for exploitation, dubbing the phenomenon Death Capitalism. “When companies charge for services that replicate a deceased loved one, it can feel as though they are capitalising on the pain and longing of the bereaved,” says Dwivedi.
The data used to train these AI models—ranging from text messages to social media posts—often contains sensitive and personal information. “There is a risk that this data could be misused, either by the companies creating the AI or by third parties who gain access to it,” he explains.
We’re only beginning to address the legal angle with respect to consent and data privacy. “Whether the deceased person gave consent to their digital likeness or data being used to create a replica will remain the point of litigation,” says lawyer and jurist Abhishek A Rastogi. “The applicability of inheritance laws will come into play to examine who has the right to grant that consent after the death.” Legal questions aside, would you really want someone to keep you going based on your memes and sexts?